Cascade-Correlation Algorithm with Trainable Activation Functions

Authors

  • Fanjun Li
  • Ying Li
Abstract

Exploiting the property that the higher-order derivatives of some base functions can be expressed in terms of the primitive functions and their lower-order derivatives, a cascade-correlation algorithm with tunable activation functions is proposed in this paper. The base functions and their higher-order derivatives are used to construct the tunable activation functions in the cascade-correlation algorithm. Parallel and series schemes for constructing the activation functions are introduced. The model can simplify the neural network architecture, speed up convergence, and improve generalization. Its effectiveness is demonstrated on the two-spiral classification and Mackey-Glass time series prediction problems.
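As a rough illustration of the idea (a sketch, not code from the paper), the snippet below builds a candidate hidden unit whose activation is a trainable linear combination of a base function and its derivatives, in the spirit of the parallel construction scheme, and trains it to maximize the usual cascade-correlation objective. The choice of tanh as the base function, the number of derivative terms, the coefficient names, and the numerical gradient ascent are all illustrative assumptions.

```python
import numpy as np

def tanh_and_derivs(x):
    """tanh and its first two derivatives, all expressible through tanh itself."""
    t = np.tanh(x)
    return t, 1.0 - t ** 2, -2.0 * t * (1.0 - t ** 2)

class TunableCandidate:
    """Candidate hidden unit with a parallel-scheme tunable activation (illustrative)."""

    def __init__(self, n_inputs, rng):
        self.w = rng.normal(scale=0.1, size=n_inputs)  # input weights
        self.a = np.array([1.0, 0.0, 0.0])             # trainable activation coefficients

    def output(self, X):
        g0, g1, g2 = tanh_and_derivs(X @ self.w)
        return self.a[0] * g0 + self.a[1] * g1 + self.a[2] * g2

    def correlation(self, X, residuals):
        """Cascade-correlation objective: |covariance(candidate output, residual error)|."""
        v = self.output(X)
        return abs(np.sum((v - v.mean()) * (residuals - residuals.mean())))

    def train(self, X, residuals, lr=0.01, steps=200, eps=1e-5):
        """Maximize the correlation by numerical gradient ascent over both the
        input weights and the activation coefficients (for illustration only)."""
        for _ in range(steps):
            for p in (self.w, self.a):
                grad = np.zeros_like(p)
                for i in range(p.size):
                    saved = p[i]
                    p[i] = saved + eps
                    up = self.correlation(X, residuals)
                    p[i] = saved - eps
                    down = self.correlation(X, residuals)
                    p[i] = saved
                    grad[i] = (up - down) / (2 * eps)
                p += lr * grad

# Once trained, the candidate would be frozen and installed as a new hidden unit,
# and the output weights retrained, as in standard cascade-correlation.
```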


Related Articles

A Cascade-Correlation Learning Network with Smoothing

A cascade correlation learning network (CCLN) is a popular supervised learning architecture that gradually grows hidden neurons with fixed nonlinear activation functions, adding neurons to the network one by one during training. Because of the fixed activation functions, the cascaded connections from the existing neurons to the new candidate neuron are required to approximate high-o...


A Structure Trainable Neural Network with Embedded Gating Units and Its Learning Algorithm

Many problems solved by multilayer neural networks (MLNNs) reduce to pattern mapping. If the mapping includes several different rules, it is difficult to solve these problems using a single MLNN with linear connection weights and continuous activation functions. In this paper, a structure trainable neural network is proposed. Gate units are embedded, which can be tr...


What's Wrong with A Cascaded Correlation Learning Network: A Projection Pursuit Learning Perspective

Cascaded correlation is a popular supervised learning architecture that dynamically grows layers of hidden neurons of fixed nonlinear activations (e.g., sigmoids), so that the network topology (size, depth) can be efficiently determined. Similar to a cascaded correlation learning network (CCLN), a projection pursuit learning network (PPLN) also dynamically grows the hidden neurons. Unlike a CCLN wh...


Comparing heterogeneous entities using artificial neural networks of trainable weighted structural components and machine-learned activation functions

To compare entities of differing types and structural components, the artificial neural network paradigm was used to cross-compare structural components between heterogeneous documents. Trainable weighted structural components were input into machine-learned activation functions of the neurons. The model was used for matching news articles and videos, where the inputs and activation functio...


Recurrent neural networks with trainable amplitude of activation functions

An adaptive amplitude real time recurrent learning (AARTRL) algorithm for fully connected recurrent neural networks (RNNs) employed as nonlinear adaptive filters is proposed. Such an algorithm is beneficial when dealing with signals that have rich and unknown dynamical characteristics. Following the approach from, three different cases for the algorithm are considered; a common adaptive amplitu...
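As a rough sketch of the trainable-amplitude idea described above (an assumption for illustration, not the AARTRL reference implementation), the activation below computes lambda * tanh(x) and updates the amplitude lambda by gradient descent alongside the other parameters.

```python
import numpy as np

class AdaptiveAmplitudeTanh:
    """tanh nonlinearity with a trainable amplitude (illustrative sketch)."""

    def __init__(self, amplitude=1.0):
        self.amplitude = amplitude   # trainable amplitude parameter (lambda)

    def forward(self, x):
        self._t = np.tanh(x)         # cache for the backward pass
        return self.amplitude * self._t

    def backward(self, grad_out, lr=1e-3):
        # Gradient w.r.t. the input scales by amplitude * (1 - tanh^2);
        # gradient w.r.t. the amplitude is just the cached tanh output.
        grad_in = grad_out * self.amplitude * (1.0 - self._t ** 2)
        self.amplitude -= lr * float(np.sum(grad_out * self._t))
        return grad_in
```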



Journal:
  • Computer and Information Science

Volume 4, Issue

Pages -

Publication date: 2011